correlated variables - translation into Russian

Any statistical relationship, whether causal or not, between two random variables or bivariate data.
Figure captions (images not reproduced here):
  • Anscombe's quartet: four sets of data with the same correlation of 0.816 (see the Python sketch after this list).
  • Several sets of (x, y) points, with the Pearson correlation coefficient of x and y for each set. The correlation reflects the noisiness and direction of a linear relationship (top row), but not the slope of that relationship (middle), nor many aspects of nonlinear relationships (bottom). N.B.: the figure in the center has a slope of 0, but in that case the correlation coefficient is undefined because the variance of Y is zero.
  • Example scatterplots of various datasets with various correlation coefficients.
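
The Anscombe caption can be checked numerically. The following minimal Python sketch computes the Pearson coefficient for each of the four datasets; the data values are reproduced here from the commonly published Anscombe (1973) table and are worth verifying against the original before reuse.

    # Pearson correlation r = cov(x, y) / (std(x) * std(y)),
    # applied to Anscombe's quartet. Data values are reproduced from the
    # commonly cited Anscombe (1973) table; verify before reuse.
    from math import sqrt

    def pearson(x, y):
        n = len(x)
        mx, my = sum(x) / n, sum(y) / n
        cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
        return cov / sqrt(sum((a - mx) ** 2 for a in x) * sum((b - my) ** 2 for b in y))

    x123 = [10, 8, 13, 9, 11, 14, 6, 4, 12, 7, 5]
    x4   = [8, 8, 8, 8, 8, 8, 8, 19, 8, 8, 8]
    quartet = {
        "I":   (x123, [8.04, 6.95, 7.58, 8.81, 8.33, 9.96, 7.24, 4.26, 10.84, 4.82, 5.68]),
        "II":  (x123, [9.14, 8.14, 8.74, 8.77, 9.26, 8.10, 6.13, 3.10, 9.13, 7.26, 4.74]),
        "III": (x123, [7.46, 6.77, 12.74, 7.11, 7.81, 8.84, 6.08, 5.39, 8.15, 6.42, 5.73]),
        "IV":  (x4,   [6.58, 5.76, 7.71, 8.84, 8.47, 7.04, 5.25, 12.50, 5.56, 7.91, 6.89]),
    }

    for name, (x, y) in quartet.items():
        # all four datasets give roughly the same value, about 0.816
        print(name, f"{pearson(x, y):.2f}")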

correlated variables

general vocabulary

коррелированные величины (correlated quantities)

correlated variables

коррелированные переменные (correlated variables)

simple correlation

mathematics

простая корреляция (simple correlation)

Definition

free variable
1. A variable referred to in a function, which is not an argument of the function. In lambda-calculus, x is a bound variable in the term M = λx.T, and a free variable of T. We say x is bound in M and free in T. If T contains a subterm λx.U then x is rebound in this term. This nested, inner binding of x is said to "shadow" the outer binding. Occurrences of x in U are free occurrences of the new x. Variables bound at the top level of a program are technically free variables within the terms to which they are bound but are often treated specially because they can be compiled as fixed addresses. Similarly, an identifier bound to a recursive function is also technically a free variable within its own body but is treated specially. A closed term is one containing no free variables. See also closure, lambda lifting, scope.
2. In logic, a variable which is not quantified (see quantifier).
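
To make the bound/free distinction concrete, here is a minimal Python sketch (not part of the dictionary entry) that computes the set of free variables of a lambda term represented as a small tagged tree; the constructors "var", "lam", and "app" are illustrative names, not a standard library.

    # Minimal sketch: free variables of a lambda-calculus term.
    # Terms are tuples: ("var", x), ("lam", x, body), ("app", f, a).

    def free_vars(term):
        tag = term[0]
        if tag == "var":                  # a variable occurrence is free by itself
            return {term[1]}
        if tag == "lam":                  # λx.body binds x, so remove it
            _, x, body = term
            return free_vars(body) - {x}
        if tag == "app":                  # application: union of both sides
            _, f, a = term
            return free_vars(f) | free_vars(a)
        raise ValueError(f"unknown term: {term!r}")

    # M = λx. x y  --  x is bound in M, y is free in M
    M = ("lam", "x", ("app", ("var", "x"), ("var", "y")))
    print(free_vars(M))    # {'y'}

    # Shadowing: in λx. (x (λx. x)) the inner λx rebinds x; no free variables remain
    N = ("lam", "x", ("app", ("var", "x"), ("lam", "x", ("var", "x"))))
    print(free_vars(N))    # set()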

Wikipedia

Correlation

In statistics, correlation or dependence is any statistical relationship, whether causal or not, between two random variables or bivariate data. Although in the broadest sense, "correlation" may indicate any type of association, in statistics it usually refers to the degree to which a pair of variables are linearly related. Familiar examples of dependent phenomena include the correlation between the height of parents and their offspring, and the correlation between the price of a good and the quantity the consumers are willing to purchase, as depicted in the so-called demand curve.
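
The degree of linear relation mentioned above is usually quantified by the Pearson coefficient; its standard population definition (not spelled out in this excerpt) is, in LaTeX notation,

    \rho_{X,Y} \;=\; \frac{\operatorname{cov}(X,Y)}{\sigma_X\,\sigma_Y}
               \;=\; \frac{\operatorname{E}\!\left[(X-\mu_X)(Y-\mu_Y)\right]}{\sigma_X\,\sigma_Y},

where \mu_X, \mu_Y are the means and \sigma_X, \sigma_Y the standard deviations of X and Y.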

Correlations are useful because they can indicate a predictive relationship that can be exploited in practice. For example, an electrical utility may produce less power on a mild day based on the correlation between electricity demand and weather. In this example, there is a causal relationship, because extreme weather causes people to use more electricity for heating or cooling. However, in general, the presence of a correlation is not sufficient to infer the presence of a causal relationship (i.e., correlation does not imply causation).

Formally, random variables are dependent if they do not satisfy a mathematical property of probabilistic independence. In informal parlance, correlation is synonymous with dependence. However, when used in a technical sense, correlation refers to any of several specific types of mathematical operations between the tested variables and their respective expected values. Essentially, correlation is the measure of how two or more variables are related to one another. There are several correlation coefficients, often denoted ρ or r, measuring the degree of correlation. The most common of these is the Pearson correlation coefficient, which is sensitive only to a linear relationship between two variables (which may be present even when one variable is a nonlinear function of the other). Other correlation coefficients – such as Spearman's rank correlation – have been developed to be more robust than Pearson's, that is, more sensitive to nonlinear relationships. Mutual information can also be applied to measure dependence between two variables.
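
As a rough, self-contained illustration of the contrast drawn above between Pearson's and Spearman's coefficients, the Python sketch below applies both (via scipy.stats.pearsonr and scipy.stats.spearmanr) to synthetic data with a perfectly monotone but nonlinear relationship; the data are made up purely for the example.

    # Sketch contrasting Pearson's r and Spearman's rank correlation on a
    # relationship that is monotone but not linear (y = x**3).
    # Requires NumPy and SciPy; the data are synthetic.
    import numpy as np
    from scipy.stats import pearsonr, spearmanr

    x = np.linspace(1.0, 10.0, 50)
    y = x ** 3                            # monotone increasing, not linear

    r, _ = pearsonr(x, y)                 # sensitive to linear association only
    rho, _ = spearmanr(x, y)              # depends only on the ranks

    print(f"Pearson  r   = {r:.3f}")      # less than 1, since the relation is not linear
    print(f"Spearman rho = {rho:.3f}")    # exactly 1 for any strictly increasing function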